
    Dynamic update of a virtual cell for programming and safe monitoring of an industrial robot

    A hardware/software architecture for robot motion planning and online safe monitoring has been developed with the objective of ensuring high flexibility in production control and safety for workers and machinery, through a user-friendly interface. The architecture, developed using Microsoft Robotics Developer Studio and implemented for a six-dof COMAU NS 12 robot, establishes a bidirectional communication between the robot controller and a virtual replica of the real robotic cell. The working space of the real robot can then be easily limited for safety reasons by inserting virtual objects (or sensors) into this virtual environment. This paper investigates the possibility of achieving an automatic, dynamic update of the virtual cell by using a low-cost depth sensor (i.e., a commercial Microsoft Kinect) to detect the presence of completely unknown objects moving inside the real cell. The experimental tests show that the developed architecture is able to recognize variously shaped mobile objects inside the monitored area and to stop the robot before it collides with them, provided the objects are not too small.
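
    As an illustration of the kind of depth-based intrusion detection described above, the following minimal Python sketch flags an unknown object by comparing the current depth frame against a learned empty-cell background. It is not the paper's MRDS-based implementation; the tolerance, blob-size threshold, and frame shapes are assumptions made for the example.

        # Illustrative sketch (assumed parameters): unknown-object detection by
        # depth background subtraction over a monitored cell.
        import numpy as np

        DEPTH_TOLERANCE_M = 0.05   # assumed sensor noise band
        MIN_OBJECT_PIXELS = 400    # ignore blobs too small to detect reliably

        def detect_intrusion(background_depth: np.ndarray,
                             current_depth: np.ndarray) -> bool:
            """Return True if an unknown object has entered the monitored volume."""
            # Pixels significantly closer than the learned empty-cell background
            closer = (background_depth - current_depth) > DEPTH_TOLERANCE_M
            # Discard invalid (zero) depth readings reported by the sensor
            closer &= current_depth > 0
            return int(closer.sum()) >= MIN_OBJECT_PIXELS

        # Synthetic frames: an object appears 0.5 m in front of the background
        bg = np.full((480, 640), 3.0)       # empty cell, 3 m to the back wall
        frame = bg.copy()
        frame[200:260, 300:360] = 2.5       # intruding object
        if detect_intrusion(bg, frame):
            print("Unknown object detected: request robot stop")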

    Mechatronics versus Robotics

    Bolton defines mechatronics as the integration of electronics, control engineering, and mechanical engineering, thus recognizing the fundamental role of control in joining electronics and mechanics. A robot is commonly considered a typical mechatronic system, which integrates software, control, electronics, and mechanical design in a synergistic manner. Robotics can be considered a part of mechatronics; i.e., all robots are mechatronic systems, but not all mechatronic systems are robots. Advanced robots usually plan their actions by combining an assigned functional task with knowledge about the environment in which they operate. Using a simplified approach, advanced robots could be defined as mechatronic devices governed by a smart brain placed at a higher hierarchical level. Actuators are building blocks of any mechatronic system. Such systems, however, have a huge application span, ranging from low-cost consumer applications to high-end, high-precision industrial manufacturing equipment.

    Data-driven framework to improve collaborative human-robot flexible manufacturing applications

    The manufacturing assembly lines of the future are foreseen to dismiss fully unmanned systems in favour of anthropocentric solutions. However, bringing in human complexity leads to modeling and control questions that only data can answer. Moreover, many human-robot collaborative applications in flexible manufacturing involve collaborative manipulators (cobots), whereas little attention is given to the role of mobile robots. This work outlines a data-driven framework, the core of a brand-new project to be fully developed in the near future, to let human-robot collaborative processes overcome the barriers to successful interaction, leveraging both mobile and fixed-base robots.

    A new HW/SW architecture to move from AGVs towards Autonomous Mobile Robots

    This paper proposes the basic concepts of a brand-new HW/SW architecture, whose development is in progress through an academic/industrial collaboration, aimed at obtaining a mobile agent capable of merging the standard characteristics of Automated Guided Vehicles with some of the potentialities of Autonomous Mobile Robots, with particular care for safety issues. Its HW/SW features, together with its mechanical characteristics, make it potentially applicable in both industrial and research contexts.

    Development of a Virtual Collision Sensor for Industrial Robots

    Collision detection is a fundamental issue for the safety of a robotic cell. While several common methods require specific sensors or knowledge of the robot dynamic model, the proposed solution is a virtual collision sensor for industrial manipulators, which requires as inputs only the motor currents measured by the standard sensors that equip a manipulator and the estimated currents provided by an internal dynamic model of the robot (i.e., the one used inside its controller), whose structure, parameters, and accuracy are not known. Collision detection is achieved by comparing the absolute value of the current residue with a time-varying, positive-valued threshold function, including an estimate of the model error and a bias term corresponding to the minimum collision torque to be detected. The value of this term, which defines the sensor sensitivity, can simply be imposed as constant, or automatically customized for a specific robotic application through a learning phase and a subsequent adaptation process, to achieve more robust and faster collision detection, as well as the avoidance of false collision warnings, even in case of slow variations of the robot behavior. Experimental results are provided to confirm the validity of the proposed solution, which is already adopted in some industrial scenarios.
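
    The residue-versus-threshold test described above can be sketched in a few lines of Python. This is only a minimal illustration: the model-error estimator and the bias value are simplified assumptions, not the authors' learning and adaptation scheme.

        # Minimal sketch (assumed, simplified) of the residue-vs-threshold test.
        def collision_detected(measured_current, estimated_current,
                               model_error_estimate, bias):
            """Flag a collision when the current residue exceeds the threshold.

            measured_current / estimated_current: motor current from the drive
            and from the controller's internal dynamic model (same units).
            model_error_estimate: time-varying bound on the model mismatch.
            bias: minimum collision torque (in current units) to be detected,
            i.e. the sensitivity of the virtual sensor.
            """
            residue = abs(measured_current - estimated_current)
            threshold = model_error_estimate + bias
            return residue > threshold

        # Example: a 1.8 A residue against a 0.5 A error bound plus a 1.0 A bias
        print(collision_detected(10.3, 8.5, 0.5, 1.0))   # True -> stop the robot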

    Sen3Bot Net: a meta-sensors network to enable smart factories implementation

    In the near future, an increasing number of mobile agents working closely with human operators is envisaged in smart factories. In industrial human-shared environments that employ traditional Automated Guided Vehicles, safety can be ensured thanks to the support provided by Autonomous Mobile Robots, acting as a net of meta-sensors. The localization and perception information of each meta-sensor is shared among all mobile platforms. In particular, the information about the dynamic detection of human presence is combined and uploaded to a shared map, increasing the awareness of the mobile robots about their surroundings in a specific working area. This paper proposes an architecture that integrates the meta-sensors with an existing net of Automated Guided Vehicles, with the aim of enhancing systems based on outdated mobile agents that seek Industry 4.0 solutions without the necessity of a complete renewal. Simulations of test scenarios are provided to confirm the validity of the proposed architecture model.
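
    A conceptual Python sketch of the shared human-presence map fed by several meta-sensors follows. The cell size, map extent, and timestamp-based decay are assumptions made for the example, not the Sen3Bot Net design.

        # Conceptual sketch (assumed parameters) of a shared human-presence map.
        import time

        CELL_SIZE_M = 0.5
        DETECTION_TTL_S = 5.0      # forget detections older than this

        class SharedHumanMap:
            def __init__(self):
                self._cells = {}   # (ix, iy) -> last detection timestamp

            def report_human(self, x: float, y: float) -> None:
                """Called by any meta-sensor that detects a human at (x, y)."""
                key = (int(x // CELL_SIZE_M), int(y // CELL_SIZE_M))
                self._cells[key] = time.time()

            def human_nearby(self, x: float, y: float, radius_m: float) -> bool:
                """Query used by a mobile platform before entering an area."""
                now = time.time()
                for (ix, iy), stamp in self._cells.items():
                    if now - stamp > DETECTION_TTL_S:
                        continue
                    cx, cy = (ix + 0.5) * CELL_SIZE_M, (iy + 0.5) * CELL_SIZE_M
                    if (cx - x) ** 2 + (cy - y) ** 2 <= radius_m ** 2:
                        return True
                return False

        shared_map = SharedHumanMap()
        shared_map.report_human(4.2, 1.7)              # AMR 1 sees an operator
        print(shared_map.human_nearby(4.0, 2.0, 1.0))  # AGV 2 checks its path -> True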

    Online supervised global path planning for AMRs with human-obstacle avoidance

    In smart factories, the performance of the production lines is improved thanks to the wide application of mobile robots. In workspaces where human operators and mobile robots coexist, safety is a fundamental factor to be considered. In this context, the motion planning of Autonomous Mobile Robots is a challenging task, since it must take the human factor into account. In this paper, an implementation of a three-level online path planning is proposed, in which a set of waypoints belonging to a safe path is computed by a supervisory planner. Depending on the nature of the obstacles detected during the robot motion, re-computation of the safe path may be enabled after the collision avoidance action provided by the local planner is initiated. Particular attention is devoted to the detection and avoidance of human operators. The supervisory planner is triggered when the detected human gets sufficiently close to the mobile robot, allowing it to follow a new safe virtual path while conservatively circumnavigating the operator. The proposed algorithm has been experimentally validated in a laboratory environment emulating industrial scenarios.
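
    The supervisory trigger logic described above can be sketched as follows: when a detected human gets closer than a safety distance, a new set of safe waypoints is requested that circumnavigates the operator. The trigger distance and the planner call are illustrative assumptions, not the paper's implementation.

        # Hedged sketch (assumed trigger distance and planner interface).
        import math

        HUMAN_TRIGGER_DISTANCE_M = 2.0   # assumed activation distance

        def supervise(robot_xy, human_xy, current_waypoints, replan_around):
            """Return the waypoint list the local planner should follow."""
            if math.dist(robot_xy, human_xy) < HUMAN_TRIGGER_DISTANCE_M:
                # Supervisory planner: compute a new safe virtual path that
                # conservatively keeps clear of the operator.
                return replan_around(robot_xy, human_xy)
            return current_waypoints

        # Example with a dummy planner that offsets the path around the human
        def dummy_replanner(robot_xy, human_xy):
            return [(robot_xy[0], robot_xy[1] + 1.5),
                    (human_xy[0] + 2.0, human_xy[1])]

        print(supervise((0.0, 0.0), (1.0, 0.5), [(3.0, 0.0)], dummy_replanner))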

    PoinTap system: a human-robot interface to enable remotely controlled tasks

    In the last decades, industrial manipulators have been used to speed up the production process and to perform tasks that may put humans at risk. However, the typical interfaces employed to teleoperate a robot are not very intuitive to use. In fact, it takes longer to learn and properly control a robot whose interface is not easy to use, and this may also increase the operator's stress and mental workload. In this paper, a touchscreen interface for supervised assembly tasks is proposed, using an LCD screen and a hand-tracking sensor. The aim is to provide an intuitive remotely controlled system that enables a flexible execution of assembly tasks: high-level decisions are entrusted to the human operator while the robot executes pick-and-place operations. A demonstrative industrial case study showcases the system's potential: it was first tested in simulation, and then experimentally validated using a real robot in a laboratory environment.
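
    A minimal sketch of the operator-to-robot delegation described above: a tapped screen position is mapped to workspace coordinates and turned into a pick-and-place request. The screen resolution, workspace size, and command structure are assumptions for illustration, not the PoinTap implementation.

        # Illustrative mapping (assumed geometry) from screen taps to a command.
        SCREEN_W, SCREEN_H = 1920, 1080        # assumed LCD resolution
        TABLE_W_M, TABLE_H_M = 0.96, 0.54      # assumed workspace shown on screen

        def tap_to_workspace(px: int, py: int):
            """Map a tapped pixel to (x, y) coordinates on the assembly table."""
            return (px / SCREEN_W * TABLE_W_M, py / SCREEN_H * TABLE_H_M)

        def make_pick_and_place(pick_px, place_px):
            """Build the high-level command the operator delegates to the robot."""
            return {"pick": tap_to_workspace(*pick_px),
                    "place": tap_to_workspace(*place_px)}

        print(make_pick_and_place((480, 270), (1440, 810)))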

    EValueAction: a proposal for policy evaluation in simulation to support interactive imitation learning

    The up-and-coming concept of Industry 5.0 foresees human-centric flexible production lines, where collaborative robots support the human workforce. In order to allow a seamless collaboration between intelligent robots and human workers, designing solutions for non-expert users is crucial. Learning from demonstration has emerged as the enabling approach to address this problem. However, more focus should be put on finding safe solutions that optimize the cost associated with the demonstration collection process. This paper introduces a preliminary outline of a system, named EValueAction (EVA), designed to assist the human in the process of collecting interactive demonstrations, taking advantage of simulation to safely avoid failures. A policy is pre-trained with human demonstrations and, where needed, new informative data are interactively gathered and aggregated to iteratively improve the initial policy. A trial case study further reinforces the relevance of the work by demonstrating the crucial role of informative demonstrations for generalization.
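
    The interactive loop outlined above (pre-training, safe evaluation in simulation, aggregation of new informative demonstrations) can be sketched schematically in Python. All function names below are placeholders, not the EVA API.

        # Schematic sketch of an interactive imitation-learning loop with
        # simulation-based policy evaluation (placeholder components).
        def eva_loop(policy, demos, train, evaluate_in_sim, ask_human, iterations=5):
            policy = train(policy, demos)              # pre-training on human demos
            for _ in range(iterations):
                failures = evaluate_in_sim(policy)     # safe evaluation, no real robot
                if not failures:
                    break                              # policy generalizes well enough
                new_demos = [ask_human(state) for state in failures]
                demos = demos + new_demos              # data aggregation
                policy = train(policy, demos)          # iterative improvement
            return policy

        # Trivial usage with dummy components just to show the control flow
        result = eva_loop(policy=None,
                          demos=[("s0", "a0")],
                          train=lambda p, d: {"n_demos": len(d)},
                          evaluate_in_sim=lambda p: [] if p["n_demos"] > 2 else ["s_fail"],
                          ask_human=lambda s: (s, "corrective_action"))
        print(result)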

    Dynamic Path Planning of a mobile robot adopting a costmap layer approach in ROS2

    Mobile robots can greatly contribute to achieving the production flexibility envisaged by the Industry 4.0 paradigm, provided that they show an adequate level of autonomy to operate in a typical industrial environment, in which the presence of both static and dynamic obstacles must be managed. Robot Operating System (ROS) is a well-known open-source platform for the development of robotic applications, recently updated to the enhanced ROS2 version, which includes a navigation stack (Nav2) providing most, but not all, of the capabilities required of a mobile robot operating in an industrial environment. In particular, it does not embed a strategy for dynamic obstacle handling. The aim of this paper is to enhance Nav2 through the development of a Dynamic Obstacle Layer, as a plug-and-play solution suitable for integrating the dynamic obstacle information acquired by a generic 2D LiDAR sensor. The effectiveness of the proposed solution is validated through a campaign of simulation tests, carried out in Webots for a TurtleBot3 Burger robot equipped with an RPLIDAR A3 LiDAR sensor.
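
    To illustrate what a costmap layer of this kind does conceptually, the Python sketch below projects 2D LiDAR returns into a grid and marks them as lethal cost for the planner to avoid. The real Nav2 layer is a C++ plugin; the resolution, cost values, and scan format here are assumptions for illustration only.

        # Conceptual sketch (assumed parameters), not the Nav2 plugin API.
        import math
        import numpy as np

        RESOLUTION_M = 0.05          # assumed grid resolution
        LETHAL_COST = 254            # Nav2-style lethal obstacle value
        GRID_SIZE = 200              # 10 m x 10 m local costmap

        def mark_scan(costmap, robot_xy, robot_yaw, ranges, angle_min, angle_inc):
            """Project LiDAR returns into the costmap and mark them as lethal."""
            for i, r in enumerate(ranges):
                if not math.isfinite(r):
                    continue
                theta = robot_yaw + angle_min + i * angle_inc
                x = robot_xy[0] + r * math.cos(theta)
                y = robot_xy[1] + r * math.sin(theta)
                ix, iy = int(x / RESOLUTION_M), int(y / RESOLUTION_M)
                if 0 <= ix < GRID_SIZE and 0 <= iy < GRID_SIZE:
                    costmap[iy, ix] = LETHAL_COST

        grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.uint8)
        mark_scan(grid, robot_xy=(5.0, 5.0), robot_yaw=0.0,
                  ranges=[1.0, 1.2, float("inf")], angle_min=-0.1, angle_inc=0.1)
        print(int((grid == LETHAL_COST).sum()), "cells marked as dynamic obstacles")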